On Subgradient Projectors
Abstract
The subgradient projector is of considerable importance in convex optimization because it plays the key role in Polyak’s seminal work — and the many papers it spawned — on subgradient projection algorithms for solving convex feasibility problems. In this paper, we offer a systematic study of the subgradient projector. Fundamental properties such as continuity, nonexpansiveness, and monotonicity are investigated. We also discuss the Yamagishi–Yamada operator. Numerous examples illustrate our results.

2010 Mathematics Subject Classification: Primary 90C25; Secondary 47H04, 47H05, 47H09, 47N10.
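To make the central object concrete: given a convex function f, Polyak's subgradient projector onto the level set {z : f(z) ≤ 0} maps x to x − (f(x)/‖g‖²)·g for a subgradient g ∈ ∂f(x) when f(x) > 0, and fixes x otherwise. A minimal NumPy sketch of this operator (the names `subgradient_projector`, `f`, and `subgrad` are illustrative, not taken from the paper):

```python
import numpy as np

def subgradient_projector(x, f, subgrad):
    """Polyak subgradient projector onto the level set {z : f(z) <= 0}.

    If f(x) > 0, step from x along a subgradient g of f at x by
    f(x) / ||g||^2; otherwise x already lies in the level set.
    """
    fx = f(x)
    if fx <= 0:
        return x
    g = subgrad(x)
    return x - (fx / np.dot(g, g)) * g

# Example: f(x) = ||x|| - 1, so {f <= 0} is the closed unit ball.
# For ||x|| > 1 a subgradient is x / ||x||, and the subgradient projector
# recovers the exact metric projection x / ||x||.
f = lambda x: np.linalg.norm(x) - 1.0
subgrad = lambda x: x / np.linalg.norm(x)
x = np.array([3.0, 4.0])
print(subgradient_projector(x, f, subgrad))  # -> [0.6, 0.8]
```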
Similar articles
A new Levenberg-Marquardt approach based on Conjugate gradient structure for solving absolute value equations
In this paper, we present a new approach for solving the absolute value equation (AVE), which uses a Levenberg-Marquardt method with a conjugate subgradient structure. In conjugate subgradient methods, the new direction is obtained by combining the steepest descent direction and the previous direction, which may not lead to good numerical results. Therefore, we replace the steepest descent direction...
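For orientation, here is a toy sketch of a plain Levenberg-Marquardt iteration for the AVE Ax − |x| = b, using the generalized Jacobian A − diag(sign(x)). This is the baseline scheme the abstract starts from, not the paper's conjugate-gradient-structured variant, and all names and the test data are illustrative:

```python
import numpy as np

def lm_ave(A, b, x0, mu=1e-2, tol=1e-10, max_iter=100):
    """Plain Levenberg-Marquardt for the absolute value equation
    Ax - |x| = b, with generalized Jacobian J = A - diag(sign(x))."""
    x = x0.copy()
    for _ in range(max_iter):
        F = A @ x - np.abs(x) - b                  # residual
        if np.linalg.norm(F) < tol:
            break
        J = A - np.diag(np.sign(x))                # generalized Jacobian of F
        # Damped Gauss-Newton (LM) step: (J^T J + mu*I) d = -J^T F
        d = np.linalg.solve(J.T @ J + mu * np.eye(len(x)), -J.T @ F)
        x = x + d
    return x

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + 4.0 * np.eye(n)  # keep A well-conditioned
x_true = rng.standard_normal(n)
b = A @ x_true - np.abs(x_true)
x = lm_ave(A, b, np.zeros(n))
print(np.linalg.norm(A @ x - np.abs(x) - b))       # residual near 0
```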
Convergence Rates for Deterministic and Stochastic Subgradient Methods Without Lipschitz Continuity
We extend the classic convergence rate theory for subgradient methods to apply to non-Lipschitz functions. For the deterministic projected subgradient method, we present a global O(1/√T) convergence rate for any convex function which is locally Lipschitz around its minimizers. This approach is based on Shor’s classic subgradient analysis and implies generalizations of the standard convergence...
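For reference, a minimal sketch of the deterministic projected subgradient method with the constant step R/(G√T) behind the classic O(1/√T) bound on the averaged iterate. The toy problem and all names are illustrative, and this is the standard Lipschitz setting, not the paper's non-Lipschitz extension:

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, T, R, G):
    """Projected subgradient method with constant step R/(G*sqrt(T)),
    which gives the classic O(1/sqrt(T)) bound on the averaged iterate."""
    x = x0.copy()
    x_avg = np.zeros_like(x0)
    step = R / (G * np.sqrt(T))
    for _ in range(T):
        x = project(x - step * subgrad(x))
        x_avg += x / T
    return x_avg

# Toy problem: minimize ||x - c||_1 over the unit ball; the minimizer
# works out to (1, -1)/sqrt(2).
c = np.array([2.0, -1.0])
subgrad = lambda x: np.sign(x - c)                   # a subgradient of the l1 loss
project = lambda x: x / max(1.0, np.linalg.norm(x))  # projection onto unit ball
print(projected_subgradient(subgrad, project, np.zeros(2),
                            T=10_000, R=1.0, G=np.sqrt(2)))
```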
Subgradient Methods for Saddle-Point Problems
We consider computing the saddle points of a convex-concave function using subgradient methods. The existing literature on finding saddle points has mainly focused on establishing convergence properties of the generated iterates under some restrictive assumptions. In this paper, we propose a subgradient algorithm for generating approximate saddle points and provide per-iteration convergence rates...
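As a baseline for this line of work, a minimal sketch of the classic simultaneous projected subgradient descent (in x) / ascent (in y) scheme, whose averaged iterates form approximate saddle points. The bilinear toy game and all names are illustrative, not the specific algorithm proposed in the paper:

```python
import numpy as np

def saddle_subgradient(gx, gy, proj_x, proj_y, x0, y0, T, step):
    """Simultaneous projected subgradient descent in x / ascent in y;
    the averaged iterates form an approximate saddle point."""
    x, y = x0.copy(), y0.copy()
    x_avg, y_avg = np.zeros_like(x0), np.zeros_like(y0)
    for _ in range(T):
        x, y = (proj_x(x - step * gx(x, y)),   # simultaneous update:
                proj_y(y + step * gy(x, y)))   # both use the old (x, y)
        x_avg += x / T
        y_avg += y / T
    return x_avg, y_avg

# Toy bilinear game L(x, y) = x*y on [-1, 1]^2; the saddle point is (0, 0).
clip = lambda z: np.clip(z, -1.0, 1.0)
T = 20_000
x_bar, y_bar = saddle_subgradient(lambda x, y: y, lambda x, y: x,
                                  clip, clip,
                                  np.array([1.0]), np.array([1.0]),
                                  T, step=1.0 / np.sqrt(T))
print(x_bar, y_bar)  # both near 0
```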
Incremental Stochastic Subgradient Algorithms for Convex Optimization
This paper studies the effect of stochastic errors on two constrained incremental subgradient algorithms. The incremental subgradient algorithms are viewed as decentralized network optimization algorithms as applied to minimize a sum of functions, where each component function is known only to a particular agent of a distributed network. First, the standard cyclic incremental subgradient algorithm...
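For context, a sketch of the standard cyclic incremental subgradient method without the stochastic errors studied in the paper: each inner update uses the subgradient of a single component f_i, as when each agent holds one component. Names and toy data are illustrative:

```python
import numpy as np

def incremental_subgradient(subgrads, project, x0, epochs, step):
    """Cyclic incremental subgradient method for min sum_i f_i(x):
    each inner update uses the subgradient of one component f_i."""
    x = x0
    for _ in range(epochs):
        for g_i in subgrads:        # one cycle through the components/agents
            x = project(x - step * g_i(x))
    return x

# Toy problem: minimize sum_i |x - a_i| over [-10, 10]; the minimizer is
# the median of the a_i, here 3.0.
a = [-2.0, 0.5, 3.0, 4.0, 7.0]
subgrads = [lambda x, ai=ai: np.sign(x - ai) for ai in a]
project = lambda x: np.clip(x, -10.0, 10.0)
print(incremental_subgradient(subgrads, project, 0.0, epochs=1000, step=5e-3))
```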
On the Efficiency of the ε-subgradient Methods over Nonlinearly Constrained Networks
The efficiency of network flow techniques can be exploited in the solution of nonlinearly constrained network flow problems by means of approximate subgradient methods. In particular, we consider the case where the side constraints (non-network constraints) are convex. We propose to solve the dual problem by using ε-subgradient methods, given that the dual function is estimated by minimizing...
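The general pattern behind such methods can be sketched as dual ascent in which the dual function is evaluated by an inexact inner minimization of the Lagrangian, so the ascent direction is only an ε-subgradient. The toy problem below (a box in place of network constraints) and all names are illustrative, not the paper's setting:

```python
import numpy as np

def eps_subgradient_dual(inner_min, g, lam0, T, step):
    """Approximate (epsilon-)subgradient ascent on the dual: the inner
    Lagrangian minimization is inexact, so g(x_lam) is only an
    epsilon-subgradient of the dual function at lam."""
    lam = lam0
    for _ in range(T):
        x_lam = inner_min(lam)                  # inexact Lagrangian minimizer
        lam = max(0.0, lam + step * g(x_lam))   # ascent on the concave dual
    return lam, inner_min(lam)

# Toy problem: min (x1-2)^2 + (x2-2)^2  s.t.  x1 + x2 <= 2,  x in [0,3]^2.
# The optimal point is (1, 1) with multiplier lambda* = 2.
g = lambda x: x[0] + x[1] - 2.0

def inner_min(lam, iters=5):
    """A few projected-gradient steps on the Lagrangian: deliberately
    inexact, standing in for the epsilon-approximate minimization."""
    x = np.zeros(2)
    for _ in range(iters):
        grad = 2.0 * (x - 2.0) + lam
        x = np.clip(x - 0.25 * grad, 0.0, 3.0)
    return x

lam, x = eps_subgradient_dual(inner_min, g, lam0=0.0, T=500, step=0.05)
print(lam, x)  # lambda near 2, x near (1, 1)
```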
Journal: SIAM Journal on Optimization
Volume 25, Issue -
Pages -
Publication date: 2015